\[\textrm{posterior} = \frac{\textrm{prior} \cdot \textrm{likelihood}}{\textrm{normalizing constant}}\]
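In symbols, with \(\theta\) the parameters and \(y\) the data, the normalizing constant is the integral of the numerator over all parameter values — this is what makes the posterior a proper probability distribution:

\[p(\theta \mid y) = \frac{p(\theta)\, p(y \mid \theta)}{\int p(\theta)\, p(y \mid \theta)\, d\theta}\]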
Many ways to sample
The posterior samples are joint draws: each sample contains simultaneous values for every parameter.
“Bad” jumps are possible with non-zero probability
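The Metropolis rule makes this explicit: a proposed move from \(\theta\) to \(\theta'\) is accepted with probability

\[\alpha = \min\left(1,\; \frac{p(\theta' \mid y)}{p(\theta \mid y)}\right)\]

so a proposal with lower posterior density is not always rejected — it is still accepted with probability \(\alpha\). These are the "bad" jumps, and they are what let the chain explore the whole posterior instead of climbing to the mode and staying there.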
Use dnorm() with log = TRUE to get the log of the normal density for two observations of \(y\) given the current values of \(\mu\) and \(\sigma\)
[1] -2.488659
[1] -9.043736
Model likelihood is the product of the probabilities for all the observations
Add up the logged probabilities for each observation (log = TRUE):
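A minimal sketch of these two steps. The observations and current parameter values here are made up for illustration, not the slide's data:

```r
# Two hypothetical observations; mu and sigma are the current parameter values
y <- c(0.5, -1.2)
mu <- 0
sigma <- 1

# Log density of each observation under Normal(mu, sigma)
log_p <- dnorm(y, mean = mu, sd = sigma, log = TRUE)

# The model log-likelihood is the sum of the per-observation log densities,
# i.e. the log of the product of the densities
LL <- sum(log_p)
```

Summing logs instead of multiplying raw densities avoids numerical underflow once there are many observations.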
Now what?
Take the ratio of the likelihoods by subtraction
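On the log scale a ratio becomes a difference, so exponentiating the difference of log-likelihoods recovers the ratio of the raw likelihoods. A sketch with hypothetical log-likelihood values:

```r
# Hypothetical log-likelihoods for the proposed and current parameter values
LL_prop <- -12.3
LL_current <- -10.8

# Ratio of raw likelihoods, computed by subtracting on the log scale;
# this stays numerically stable even when the raw likelihoods underflow to 0
ratio <- exp(LL_prop - LL_current)
```

This equals exp(LL_prop) / exp(LL_current), but without ever forming the tiny raw likelihoods.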
If the ratio is greater than a random uniform draw, runif(1), make the move

for (ii in 2:n_samp) {
LL_current <- sum(dnorm(y, mean = s[ii - 1, 1], sd = s[ii - 1, 2],
log = TRUE))
mu_prop <- s[ii - 1, 1] + rnorm(1, 0, 1)
sigma_prop <- s[ii - 1, 2] + rnorm(1, 0, 1)
if (sigma_prop < 0) sigma_prop <- abs(sigma_prop)
LL_prop <- sum(dnorm(y, mean = mu_prop, sd = sigma_prop, log = TRUE))
if (exp(LL_prop - LL_current) > runif(1)) {
s[ii, ] <- c(mu_prop, sigma_prop)
mu <- mu_prop
sigma <- sigma_prop
n_accepted <- n_accepted + 1
} else {
s[ii, ] <- s[ii - 1, ]
n_rejected <- n_rejected + 1
}
}

          [,1]      [,2]
[1,] 1.0116750 0.5373229
[2,] -0.9489727 0.9602566
[3,] -0.9489727 0.9602566
[4,] -0.9489727 0.9602566
[5,] -0.9489727 0.9602566
[6,] -0.9489727 0.9602566
[7,] 0.7625157 2.2189009
[8,] 0.7625157 2.2189009
[9,] 1.7529252 2.9303900
[10,] 0.2081370 3.3238409
n_accepted
[1] 3822
n_rejected
[1] 6177
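Putting the pieces together — a self-contained sketch of the whole sampler. The data y, the starting values, and the proposal standard deviation of 1 are assumptions for illustration; the loop body mirrors the one above:

```r
set.seed(42)

# Hypothetical data: the sampler only needs a numeric vector y
y <- rnorm(20, mean = 1, sd = 2)

n_samp <- 10000
s <- matrix(NA, nrow = n_samp, ncol = 2)  # column 1 = mu, column 2 = sigma
s[1, ] <- c(0, 1)                          # arbitrary starting values
n_accepted <- 0
n_rejected <- 0

for (ii in 2:n_samp) {
  # Log-likelihood at the current parameter values
  LL_current <- sum(dnorm(y, mean = s[ii - 1, 1], sd = s[ii - 1, 2],
                          log = TRUE))
  # Symmetric random-walk proposals for both parameters
  mu_prop <- s[ii - 1, 1] + rnorm(1, 0, 1)
  sigma_prop <- s[ii - 1, 2] + rnorm(1, 0, 1)
  if (sigma_prop < 0) sigma_prop <- abs(sigma_prop)  # reflect sigma off zero
  LL_prop <- sum(dnorm(y, mean = mu_prop, sd = sigma_prop, log = TRUE))
  # Accept with probability min(1, exp(LL_prop - LL_current))
  if (exp(LL_prop - LL_current) > runif(1)) {
    s[ii, ] <- c(mu_prop, sigma_prop)
    n_accepted <- n_accepted + 1
  } else {
    s[ii, ] <- s[ii - 1, ]
    n_rejected <- n_rejected + 1
  }
}

# Acceptance rate: accepted moves as a fraction of all proposals
n_accepted / (n_samp - 1)
```

Note the chain makes n_samp - 1 proposals (row 1 is the starting point), so the accepted and rejected counts sum to n_samp - 1.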
[,1] [,2]
Lag 0 1.0000000 1.0000000
Lag 1 0.9955104 0.9978622
Lag 5 0.9800271 0.9919311
Lag 10 0.9647656 0.9866262
Lag 50 0.8500191 0.9466300
s <- s[(n_samp * 0.25):n_samp, ] # Drop 25% for burnin
s_thin <- s[seq(1, nrow(s), by = 100), ] # Thin every 100th sample
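Why thinning helps can be seen on any autocorrelated series. A sketch using base R's acf() on a simulated AR(1) chain standing in for raw MCMC output (the series itself is made up, not the sampler's):

```r
set.seed(1)

# A strongly autocorrelated AR(1) series, mimicking an unthinned chain
x <- as.numeric(arima.sim(model = list(ar = 0.99), n = 20000))

# Keep every 100th value, as in the thinning step above
x_thin <- x[seq(1, length(x), by = 100)]

# Lag-1 autocorrelation before and after thinning
# (acf()$acf[1] is lag 0, so [2] is lag 1)
r_raw  <- acf(x, plot = FALSE)$acf[2]
r_thin <- acf(x_thin, plot = FALSE)$acf[2]
```

The lag-1 autocorrelation of the thinned series is the lag-100 autocorrelation of the original, which is far smaller — the same effect the near-zero diagnostics below show for the sampler.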
library(coda)  # as.mcmc() and autocorr.diag() come from the coda package
s_mcmc <- as.mcmc(s_thin)
autocorr.diag(s_mcmc)
                [,1]          [,2]
Lag 0 1.000000000 1.0000000000
Lag 1 -0.000697156 0.0009217929
Lag 5 -0.008787172 -0.0045931536
Lag 10 0.019425561 0.0142671041
Lag 50 0.005370590 -0.0066584641
Leave it to the professionals.